Transfer Entropy and Directed Information in Gaussian diffusion processes

Author

  • Nigel J. Newton
Abstract

Transfer Entropy and Directed Information are information-theoretic measures of the directional dependency between stochastic processes. Following the definitions of Schreiber and Massey in discrete time, we define and evaluate these measures for the components of multidimensional Gaussian diffusion processes. When the components are jointly Markov, the Transfer Entropy and Directed Information are both measures of influence according to a simple physical principle. More generally, the effect of other components has to be accounted for, and this can be achieved in more than one way. We propose two definitions, one of which preserves the properties of influence of the jointly Markov case. The Transfer Entropy and Directed Information are expressed in terms of the solutions of matrix Riccati equations, and so are easy to compute. The definition of continuous-time Directed Information we propose differs from that previously appearing in the literature. We argue that the latter is not strictly directional.
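As a brief illustration of the discrete-time starting point, Schreiber's transfer entropy reduces, for jointly Gaussian variables, to half the log-ratio of conditional prediction variances. Below is a minimal sketch estimating it for a hypothetical sampled bivariate autoregression; the coefficients and the least-squares estimator are assumptions for illustration only, and the paper's continuous-time, Riccati-equation-based expressions are not reproduced here.

```python
import numpy as np

# Minimal sketch (assumption: a sampled bivariate Gaussian autoregression, Y driving X).
# For jointly Gaussian variables, Schreiber's transfer entropy is
#   T_{Y->X} = 0.5 * ln( Var(X_t | X past) / Var(X_t | X past, Y past) ).
rng = np.random.default_rng(0)
n = 200_000
a, b = 0.5, 0.4                        # hypothetical coupling coefficients
y = rng.standard_normal(n)             # Y: i.i.d. Gaussian driver
x = np.zeros(n)
for t in range(1, n):
    x[t] = a * x[t - 1] + b * y[t - 1] + rng.standard_normal()

def residual_variance(target, predictors):
    """Variance of the least-squares residual of target regressed on predictors."""
    coeffs, *_ = np.linalg.lstsq(predictors, target, rcond=None)
    return np.var(target - predictors @ coeffs)

# With Y i.i.d., a single lag of each process captures the relevant past exactly.
x_now, x_past, y_past = x[1:], x[:-1, None], y[:-1, None]
v_restricted = residual_variance(x_now, x_past)                  # condition on X past only
v_full = residual_variance(x_now, np.hstack([x_past, y_past]))   # condition on X and Y past
te = 0.5 * np.log(v_restricted / v_full)
print(f"estimated T_(Y->X) = {te:.3f} nats (theory: {0.5 * np.log(1 + b**2):.3f})")
```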


Related articles

The Rate of Entropy for Gaussian Processes

In this paper, we show that, in order to obtain the Tsallis entropy rate for stochastic processes, we can use the limit of conditional entropy, as was done for the Shannon and Rényi entropy rates. Using this, we obtain the Tsallis entropy rate for stationary Gaussian processes. Finally, we derive the relation between the Rényi, Shannon and Tsallis entropy rates for stationary Gaussian processes…


ADK Entropy and ADK Entropy Rate in Irreducible-Aperiodic Markov Chain and Gaussian Processes

In this paper, the two-parameter ADK entropy, a generalization of Rényi entropy, is considered and some of its properties are investigated. We show that the ADK entropy of a continuous random variable is invariant under a location transformation but not under a scale transformation of the random variable. Furthermore, the joint ADK entropy, conditional ADK entropy, and chain rule of this entropy…


Relating Granger causality to directed information theory for networks of stochastic processes

This paper addresses the problem of inferring the circulation of information between multiple stochastic processes. We discuss two possible frameworks in which the problem can be studied: directed information theory and Granger causality. The main goal of the paper is to study the connection between these two frameworks. In the case of directed information theory, we stress the importance of Kramer…


Granger causality and transfer entropy are equivalent for Gaussian variables.

Granger causality is a statistical notion of causal influence based on prediction via vector autoregression. Developed originally in the field of econometrics, it has since found application in a broader arena, particularly in neuroscience. More recently transfer entropy, an information-theoretic measure of time-directed information transfer between jointly dependent processes, has gained traction…
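As a brief worked relation: for jointly Gaussian variables the Granger causality statistic is exactly twice Schreiber's transfer entropy, so in the variance-ratio notation of the sketch above,

```latex
\mathcal{F}_{Y \to X}
  \;=\; \ln \frac{\operatorname{Var}\!\left(X_t \mid X_{\text{past}}\right)}
                 {\operatorname{Var}\!\left(X_t \mid X_{\text{past}},\, Y_{\text{past}}\right)}
  \;=\; 2\, T_{Y \to X}.
```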


Causal Network Inference by Optimal Causation Entropy

The broad abundance of time series data, which is in sharp contrast to our limited knowledge of the underlying network dynamic processes that produce such observations, calls for a general and efficient method of causal network inference. Here we develop the mathematical theory of Causation Entropy, a model-free information-theoretic statistic designed for causality inference. We prove that for a given…



Journal:
  • CoRR

Volume: abs/1604.01969

Publication date: 2016